On an Inequality of Karlin and Rinott Concerning Weighted Sums of i.i.d. Random Variables

Author

  • Yaming Yu
Abstract

We study the entropy of a weighted sum S = ∑_{i=1}^n a_i X_i of i.i.d. random variables X_i, assuming that the density f of X_i is log-concave, i.e., supp(f) = {x : f(x) > 0} is an interval and log f is a concave function on supp(f). It is convenient to define H(X) = H_α(X) = −∞ when X is discrete, e.g., degenerate. (Our notation differs from that of Karlin and Rinott 1981 here.) The main result is that H(S) (or H_α(S) with 0 < α < 1) is smaller when the weights a_1, . . . , a_n are more “uniform” in the sense of majorization. A real vector b = (b_1, . . . , b_n)^⊤ is said to majorize a = (a_1, . . . , a_n)^⊤, denoted a ≺ b, if there exists a doubly stochastic matrix T, i.e., an n × n matrix (t_ij) with t_ij ≥ 0 and ∑_i t_ij = ∑_j t_ij = 1, such that a = T b.
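In the Gaussian special case the monotonicity in the abstract can be checked in closed form: for i.i.d. standard normal X_i, S = ∑ a_i X_i is N(0, ∑ a_i²) with differential entropy ½ log(2πe ∑ a_i²), and ∑ a_i² is Schur-convex, so a ≺ b implies H(S_a) ≤ H(S_b). A minimal sketch (function names are my own; the partial-sum test is the standard Hardy–Littlewood–Pólya characterization of majorization, equivalent to the doubly stochastic one above):

```python
import math

def majorizes(b, a, tol=1e-12):
    """Check a ≺ b via the Hardy–Littlewood–Pólya partial-sum test:
    equal totals, and every top-k partial sum of sorted(b) dominates a's."""
    sa, sb = sorted(a, reverse=True), sorted(b, reverse=True)
    if abs(sum(sa) - sum(sb)) > tol:
        return False
    pa = pb = 0.0
    for xa, xb in zip(sa, sb):
        pa += xa
        pb += xb
        if pb < pa - tol:
            return False
    return True

def gaussian_entropy(weights):
    """Differential entropy of S = sum a_i X_i for i.i.d. standard normals:
    S ~ N(0, sum a_i^2), so H(S) = 0.5 * log(2*pi*e * sum a_i^2)."""
    var = sum(w * w for w in weights)
    return 0.5 * math.log(2 * math.pi * math.e * var)

a = [0.4, 0.3, 0.3]   # more uniform weights
b = [0.8, 0.1, 0.1]   # more spread out; b majorizes a
assert majorizes(b, a)
assert gaussian_entropy(a) <= gaussian_entropy(b)
```

This only illustrates the Gaussian case; the point of the paper is that the same monotonicity holds for every log-concave density.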


Related articles

Strong Laws for Weighted Sums of Negative Dependent Random Variables

In this paper, we discuss strong laws for weighted sums of pairwise negatively dependent random variables. The i.i.d. results of Soo Hak Sung [9] are generalized and extended.


Strong Convergence of Weighted Sums for Negatively Orthant Dependent Random Variables

In this paper we discuss strong convergence for weighted sums of negatively orthant dependent (NOD) random variables by generalized Gaussian techniques. As a corollary, a Cesàro law of large numbers for i.i.d. random variables is extended to the NOD setting by generalized Gaussian techniques.
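The i.i.d. Cesàro law referred to above says that the arithmetic means (1/n) ∑ X_i converge almost surely to E X_1. A quick Monte Carlo sketch (the Uniform(0,1) distribution, sample sizes, and seed are arbitrary choices for illustration, not part of the paper):

```python
import random

def cesaro_mean(n, seed=0):
    """Arithmetic mean of n i.i.d. Uniform(0,1) draws; E X = 0.5."""
    rng = random.Random(seed)
    return sum(rng.random() for _ in range(n)) / n

# The means tighten around 0.5 as n grows, as the Cesaro law predicts.
for n in (10, 1000, 100000):
    print(n, cesaro_mean(n))
```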


Complete Convergence and Some Maximal Inequalities for Weighted Sums of Random Variables

Let  be a sequence of arbitrary random variables with  and , for every , and let  be an array of real numbers. We obtain two maximal inequalities for partial sums and weighted sums of random variables, and we prove complete convergence for the weighted sums , under some conditions on  and the sequence .


An entropy inequality for symmetric random variables

We establish a lower bound on the entropy of weighted sums of (possibly dependent) random variables (X1, X2, . . . , Xn) possessing a symmetric joint distribution. Our lower bound is in terms of the joint entropy of (X1, X2, . . . , Xn). We show that for n ≥ 3, the lower bound is tight if and only if Xi’s are i.i.d. Gaussian random variables. For n = 2 there are numerous other cases of equality...


The Almost Sure Convergence for Weighted Sums of Linear Negatively Dependent Random Variables

In this paper, we generalize a theorem of Shao [12] by assuming that  is a sequence of linear negatively dependent random variables. We also extend some theorems of Chao [6] and Thrum [14]. It is shown by an elementary method that, for linear negatively dependent identically distributed random variables with finite -th absolute moment, the weighted sums converge to zero as , where  and  is an array of...



Journal:
  • CoRR

Volume: abs/0909.4126

Publication date: 2009